# Dense Retrieval
## Nomic Embed Multimodal 7B

A 7-billion-parameter multimodal embedding model specialized in visual document retrieval, achieving outstanding performance on the ViDoRe-v2 benchmark.

- License: Apache-2.0
- Tags: Text-to-Image, Supports Multiple Languages
- Author: nomic-ai
- Downloads: 741 · Likes: 26
## Nomic Embed Multimodal 3B

Nomic Embed Multimodal 3B is a state-of-the-art multimodal embedding model focused on visual document retrieval. It encodes text and images in a unified space and reaches 58.8 NDCG@5 on the ViDoRe-v2 benchmark.

- Tags: Text-to-Image, Supports Multiple Languages
- Author: nomic-ai
- Downloads: 3,431 · Likes: 11
## DRAMA Base

DRAMA-base is a dense retrieval model built on a pruned large language model backbone; it supports multilingual text retrieval.

- Tags: Text Embedding, Transformers, Supports Multiple Languages
- Author: facebook
- Downloads: 716 · Likes: 15
## MedCPT Query Encoder

MedCPT generates biomedical text embeddings for semantic search (dense retrieval). This checkpoint is the query-side encoder of the MedCPT query/article pair.

- License: Other
- Tags: Text Embedding, Transformers
- Author: ncbi
- Downloads: 73.74k · Likes: 40
## MedCPT Article Encoder

MedCPT generates biomedical text embeddings for semantic search (dense retrieval). This checkpoint is the article-side encoder, used to embed the documents being retrieved (usage sketch below).

- License: Other
- Tags: Text Embedding, Transformers
- Author: ncbi
- Downloads: 14.37k · Likes: 24
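The two MedCPT checkpoints are meant to be used together: queries go through the query encoder, candidate articles through the article encoder, and relevance is the dot product of the resulting vectors. A minimal sketch, assuming the ncbi/MedCPT-Query-Encoder and ncbi/MedCPT-Article-Encoder checkpoints and the [CLS]-token pooling used on the official model cards:

```python
import torch
from transformers import AutoTokenizer, AutoModel

# Load the asymmetric encoder pair (checkpoint IDs assumed from this listing).
q_tok = AutoTokenizer.from_pretrained("ncbi/MedCPT-Query-Encoder")
q_enc = AutoModel.from_pretrained("ncbi/MedCPT-Query-Encoder")
a_tok = AutoTokenizer.from_pretrained("ncbi/MedCPT-Article-Encoder")
a_enc = AutoModel.from_pretrained("ncbi/MedCPT-Article-Encoder")

query = "type 2 diabetes treatment"
articles = [
    "Metformin is a first-line medication for the treatment of type 2 diabetes.",
    "The lobster is a large marine crustacean with a rigid exoskeleton.",
]

with torch.no_grad():
    q_emb = q_enc(**q_tok(query, return_tensors="pt",
                          truncation=True)).last_hidden_state[:, 0, :]
    a_embs = a_enc(**a_tok(articles, padding=True, truncation=True,
                           return_tensors="pt")).last_hidden_state[:, 0, :]

scores = (q_emb @ a_embs.T).squeeze(0)  # dot-product relevance per article
print(scores)  # the diabetes article should score higher
```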
## DRAGON+ Context Encoder

DRAGON+ is a dense retrieval model based on the BERT architecture with an asymmetric dual-encoder design; this checkpoint encodes passages (contexts) for text retrieval.

- Tags: Text Embedding, Transformers
- Author: facebook
- Downloads: 4,396 · Likes: 39
## DRAGON+ Query Encoder

DRAGON+ is a dense retrieval model based on the BERT architecture, initialized from RetroMAE weights and trained on augmented data from the MS MARCO corpus; this checkpoint encodes queries (usage sketch below).

- Tags: Text Embedding, Transformers
- Author: facebook
- Downloads: 3,918 · Likes: 20
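As with MedCPT, the two DRAGON+ towers are used together: the query encoder embeds questions, the context encoder embeds passages, and relevance is the dot product of the two [CLS] vectors. A minimal sketch, assuming the facebook/dragon-plus-query-encoder and facebook/dragon-plus-context-encoder checkpoints:

```python
import torch
from transformers import AutoTokenizer, AutoModel

# Both towers share one tokenizer; checkpoint IDs assumed from this listing.
tokenizer = AutoTokenizer.from_pretrained("facebook/dragon-plus-query-encoder")
query_encoder = AutoModel.from_pretrained("facebook/dragon-plus-query-encoder")
context_encoder = AutoModel.from_pretrained("facebook/dragon-plus-context-encoder")

query = "Where was Marie Curie born?"
contexts = [
    "Maria Sklodowska, later known as Marie Curie, was born on November 7, 1867.",
    "Pierre Curie was born in Paris on 15 May 1859.",
]

with torch.no_grad():
    q_input = tokenizer(query, return_tensors="pt")
    c_input = tokenizer(contexts, padding=True, truncation=True, return_tensors="pt")
    q_emb = query_encoder(**q_input).last_hidden_state[:, 0, :]   # [CLS] pooling
    c_embs = context_encoder(**c_input).last_hidden_state[:, 0, :]

scores = q_emb @ c_embs.T  # dot-product relevance scores
print(scores)
```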
## T5-ANCE

T5-ANCE is a dense retrieval model trained on the MS MARCO passage dataset, combining the T5 architecture with the ANCE training procedure, and is suited to information retrieval tasks.

- License: MIT
- Tags: Text Embedding, Transformers
- Author: OpenMatch
- Downloads: 893 · Likes: 1
## SPAR Wiki BM25 Lexmodel Query Encoder

A dense retriever based on the BERT-base architecture, trained on Wikipedia articles to emulate the behavior of BM25.

- Tags: Text Embedding, Transformers
- Author: facebook
- Downloads: 80 · Likes: 2
## RankGen T5-XL All

RankGen is a suite of encoder models that map prefixes and the continuations generated by pre-trained language models into a shared vector space, improving both generation quality and retrieval performance.

- License: Apache-2.0
- Tags: Large Language Model, Transformers, English
- Author: kalpeshk2011
- Downloads: 4,535 · Likes: 1
## SciNCL WOL

A scientific literature embedding model trained on a corpus with the SciDocs leakage removed (usage sketch below).

- License: MIT
- Tags: Large Language Model, Transformers
- Author: malteos
- Downloads: 127 · Likes: 0
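Like other SPECTER-style scientific document encoders, SciNCL typically embeds a paper by joining its title and abstract with the tokenizer's separator token and taking the [CLS] vector. A minimal sketch; the malteos/scincl-wol checkpoint ID is an assumption based on this listing:

```python
import torch
from transformers import AutoTokenizer, AutoModel

# Checkpoint ID assumed from this listing.
tokenizer = AutoTokenizer.from_pretrained("malteos/scincl-wol")
model = AutoModel.from_pretrained("malteos/scincl-wol")

papers = [
    {"title": "BERT", "abstract": "We introduce a new language representation model."},
    {"title": "SciNCL", "abstract": "Neighborhood contrastive learning for scientific documents."},
]
# SPECTER-style input: title and abstract joined by the [SEP] token.
title_abs = [p["title"] + tokenizer.sep_token + p["abstract"] for p in papers]
inputs = tokenizer(title_abs, padding=True, truncation=True,
                   max_length=512, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)
embeddings = outputs.last_hidden_state[:, 0, :]  # [CLS] token per document
```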
## TCT-ColBERT-V2 HNP MS MARCO

TCT-ColBERT-V2 is a dense retrieval model distilled from a tightly-coupled teacher with in-batch negatives, designed for efficient text retrieval; the HNP ("HN+") checkpoint additionally trains with hard negatives.

- Tags: Text Embedding, Transformers
- Author: castorini
- Downloads: 1,382 · Likes: 4
## TCT-ColBERT-V2 MS MARCO

TCT-ColBERT-V2 is a knowledge-distillation-based dense retrieval model that improves retrieval efficiency and quality through a tightly-coupled teacher and in-batch negative training (usage sketch below).

- Tags: Text Embedding, Transformers
- Author: castorini
- Downloads: 2,220 · Likes: 0
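The castorini checkpoints are most easily used through Pyserini, which pairs the query encoder with a prebuilt FAISS index of pre-encoded passages. A minimal sketch, assuming Pyserini's TctColBertQueryEncoder and the prebuilt MS MARCO index name from its documentation:

```python
# Dense search over MS MARCO with TCT-ColBERT-V2 (HN+) via Pyserini.
# The prebuilt index name below is an assumption based on Pyserini's docs.
from pyserini.search.faiss import FaissSearcher, TctColBertQueryEncoder

encoder = TctColBertQueryEncoder("castorini/tct_colbert-v2-hnp-msmarco")
searcher = FaissSearcher.from_prebuilt_index(
    "msmarco-passage-tct_colbert-v2-hnp-bf", encoder
)
hits = searcher.search("what is a lobster roll?")
for hit in hits[:5]:
    print(hit.docid, round(hit.score, 4))
```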
## Contriever MS MARCO

A fine-tuned version of the Contriever pre-trained model, trained with contrastive learning and optimized for dense information retrieval (usage sketch below).

- Tags: Text Embedding, Transformers
- Author: facebook
- Downloads: 24.08k · Likes: 27
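Unlike the [CLS]-pooled models above, Contriever represents a text as the mean of its token embeddings, so the pooling step is done manually. A minimal sketch, assuming the facebook/contriever-msmarco checkpoint and the pooling function shown on the official model card:

```python
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("facebook/contriever-msmarco")
model = AutoModel.from_pretrained("facebook/contriever-msmarco")

def mean_pooling(token_embeddings, mask):
    # Zero out padding positions, then average over the sequence dimension.
    token_embeddings = token_embeddings.masked_fill(~mask[..., None].bool(), 0.0)
    return token_embeddings.sum(dim=1) / mask.sum(dim=1)[..., None]

sentences = [
    "Where was Marie Curie born?",
    "Maria Sklodowska, later known as Marie Curie, was born on November 7, 1867.",
]
inputs = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)
embeddings = mean_pooling(outputs.last_hidden_state, inputs["attention_mask"])
score = embeddings[0] @ embeddings[1]  # dot-product relevance score
print(score)
```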